

Exchangeable Neural ODE for Set Modeling

Neural Information Processing Systems

Reasoning over an instance composed of a set of vectors, such as a point cloud, requires accounting for intra-set dependencies among elements. However, since such instances are unordered, the elements' features should remain unchanged when the input's order is permuted. This property, permutation equivariance, is a challenging constraint for most neural architectures. While recent work has proposed global pooling and attention-based solutions, these may be limited in how intra-set dependencies are captured in practice. In this work we propose a more general formulation that achieves permutation equivariance through ordinary differential equations (ODEs). Our proposed module, Exchangeable Neural ODE (ExNODE), can be seamlessly applied to both discriminative and generative tasks. We also extend set modeling to the temporal dimension and propose a VAE-based model for temporal set modeling. Extensive experiments demonstrate the efficacy of our method over strong baselines.
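To illustrate the core idea, here is a minimal numpy sketch (not the authors' implementation; `equivariant_drift` and `ode_solve` are hypothetical names) of a set evolving under an ODE whose drift is permutation-equivariant: each element is transformed individually, plus a mean-pooled term that is invariant to element order, so solving the ODE commutes with permuting the set.

```python
import numpy as np

def equivariant_drift(x, W_elem, W_pool):
    # DeepSets-style drift: per-element transform plus a mean-pooled term.
    # Mean pooling over the set axis is permutation-invariant, so the whole
    # drift is permutation-equivariant.
    return np.tanh(x @ W_elem + x.mean(axis=0, keepdims=True) @ W_pool)

def ode_solve(x0, W_elem, W_pool, steps=20, t1=1.0):
    # Fixed-step Euler integration of dx/dt = drift(x); a toy stand-in for
    # an adaptive neural-ODE solver.
    x, h = x0, t1 / steps
    for _ in range(steps):
        x = x + h * equivariant_drift(x, W_elem, W_pool)
    return x

rng = np.random.default_rng(0)
d = 4
W_elem, W_pool = rng.normal(size=(d, d)), rng.normal(size=(d, d))
X = rng.normal(size=(5, d))          # a set of 5 elements
perm = rng.permutation(5)

out = ode_solve(X, W_elem, W_pool)
out_perm = ode_solve(X[perm], W_elem, W_pool)
# Permute-then-solve equals solve-then-permute: permutation equivariance.
assert np.allclose(out_perm, out[perm])
```

Because every Euler step applies an equivariant map, the property is preserved across the whole trajectory; the same argument carries over to any solver built from evaluations of the drift.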






Review for NeurIPS paper: Exchangeable Neural ODE for Set Modeling

Neural Information Processing Systems

Weaknesses: - Even though the work is theoretically sound, and the authors do mention that drift functions used in the neural ODE framework have to be Lipschitz continuous, there is no discussion of whether and how Lipschitz continuity can be achieved in general, nor whether it holds for the specific parametrizations (Deep Sets, Set Transformer) used in this work. More specifically, the vanilla self-attention of the (Set) Transformer is provably NOT Lipschitz continuous, as shown by Kim et al. Now, I do realize that the paper I am referencing is an arXiv submission that appeared after the NeurIPS submission deadline, and I do not hold this against the authors. However, the authors do not even mention that they were unaware of any results on the Lipschitz continuity of the modules used. Moreover, Deep Sets need not be Lipschitz continuous if no measures are taken to ensure it.
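As context for the reviewer's point: one standard way to enforce Lipschitz continuity of a simple linear-plus-nonlinearity drift is spectral normalization, i.e. rescaling each weight matrix so its largest singular value is below 1. The sketch below (an illustration under that assumption, not the paper's method, and it does not address self-attention, where no such simple bound applies) shows the idea with numpy; `spectrally_normalize` is a hypothetical helper.

```python
import numpy as np

def spectrally_normalize(W, target=0.9):
    # Scale W so its largest singular value equals `target` (< 1). For a
    # drift f(x) = tanh(x @ W), this bounds the Lipschitz constant by
    # `target`, since tanh is 1-Lipschitz.
    sigma = np.linalg.svd(W, compute_uv=False)[0]
    return W * (target / sigma)

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8)) * 3.0        # likely has spectral norm > 1
W_sn = spectrally_normalize(W)

sigma_after = np.linalg.svd(W_sn, compute_uv=False)[0]
assert np.isclose(sigma_after, 0.9)

# Empirical check: the distance ratio for a random point pair stays below
# the Lipschitz bound.
f = lambda x: np.tanh(x @ W_sn)
x, y = rng.normal(size=(2, 8))
ratio = np.linalg.norm(f(x) - f(y)) / np.linalg.norm(x - y)
assert ratio <= 0.9 + 1e-9
```

A Lipschitz drift guarantees, via Picard-Lindelöf, that the ODE has a unique solution, which is why the reviewer's concern matters for the well-posedness of the model.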




Exchangeable Neural ODE for Set Modeling

Li, Yang, Yi, Haidong, Bender, Christopher M., Shan, Siyuan, Oliva, Junier B.

arXiv.org Machine Learning
